LLM Adapter

Overview

  • A pluggable LLM adapter layer abstracts away provider-specific details (OpenAI, Ollama, etc.).
  • The AiChatModule module now routes its calls through LlmAdapterFactory.

Extending

  • Implement the LlmAdapter interface, returning a ChatCompletionResponse.
  • Register the implementation in LlmAdapterFactory under a new LlmDetails.ProviderEnum value.
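The two steps above can be sketched as follows. This is a minimal, self-contained illustration, not the project's actual API: the exact shapes of LlmAdapter, ChatCompletionResponse, and LlmAdapterFactory are assumptions based only on the names mentioned on this page, and MY_PROVIDER / MyProviderAdapter are hypothetical.

```java
public class AdapterSketch {
    // Assumed shape of the response type; the real class likely carries more fields.
    record ChatCompletionResponse(String content) {}

    // Assumed single-method adapter contract.
    interface LlmAdapter {
        ChatCompletionResponse chat(String prompt);
    }

    // Stand-in for LlmDetails.ProviderEnum, with a hypothetical new value.
    enum ProviderEnum { OPENAI, OLLAMA, MY_PROVIDER }

    // Step 1: implement LlmAdapter for the new provider.
    static class MyProviderAdapter implements LlmAdapter {
        @Override
        public ChatCompletionResponse chat(String prompt) {
            // A real adapter would call the provider's API here; stubbed for illustration.
            return new ChatCompletionResponse("echo: " + prompt);
        }
    }

    // Step 2: route the new enum value to the new adapter, factory-style.
    static LlmAdapter forProvider(ProviderEnum provider) {
        return switch (provider) {
            case MY_PROVIDER -> new MyProviderAdapter();
            default -> throw new IllegalArgumentException("Unsupported provider: " + provider);
        };
    }

    public static void main(String[] args) {
        LlmAdapter adapter = forProvider(ProviderEnum.MY_PROVIDER);
        System.out.println(adapter.chat("hello").content());
    }
}
```

Keeping the enum-to-adapter mapping in one factory means callers such as AiChatModule never depend on a concrete adapter class.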

Configuration

  • Typical fields on AiChatModule include openAiApiKey, openAiModel, ollamaModelUrl, and temperature.
  • Use moduleData to store per-step overrides of these fields when needed.
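One way the override lookup could work is sketched below: a value in moduleData wins over the module-level default. The field types, the map-based shape of moduleData, and the `effectiveTemperature` helper are all assumptions for illustration, not the project's actual implementation.

```java
import java.util.HashMap;
import java.util.Map;

public class ConfigSketch {
    // Assumed configuration fields on AiChatModule, per the list above.
    static class AiChatModule {
        String openAiApiKey = "sk-placeholder";          // not a real key
        String openAiModel = "gpt-4o-mini";              // example model name
        String ollamaModelUrl = "http://localhost:11434"; // default Ollama endpoint
        double temperature = 0.2;
    }

    // Hypothetical helper: a per-step override in moduleData takes
    // precedence over the module-level default.
    static double effectiveTemperature(AiChatModule module, Map<String, Object> moduleData) {
        Object override = moduleData.get("temperature");
        return (override instanceof Number n) ? n.doubleValue() : module.temperature;
    }

    public static void main(String[] args) {
        AiChatModule module = new AiChatModule();
        Map<String, Object> stepData = new HashMap<>();

        System.out.println(effectiveTemperature(module, stepData)); // module default

        stepData.put("temperature", 0.9);                           // per-step override
        System.out.println(effectiveTemperature(module, stepData)); // override wins
    }
}
```
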